
Add partial precomputation support #546

Closed
wants to merge 1 commit into from

Conversation

AaronFeickert
Contributor

@AaronFeickert AaronFeickert commented Jul 5, 2023

Currently, using precomputation for variable-time multiscalar multiplication requires the number of precomputed points and static scalars to be the same; otherwise, the relevant functions will panic.

This limits some use cases of interest. For example, a Bulletproofs+ range proving implementation was made more efficient by precomputing a large set of curve group generators, which allowed for verification of shorter proofs that don't need to use all of them. (Interestingly, the technique applies equally well to the Bulletproofs range proving system, but that's for another day!)

There are probably several ways to support this, but the most straightforward seems to be simply relaxing the panic condition. This PR does precisely that. Providing a smaller number of static scalars will simply use only the corresponding precomputed points when evaluating a multiscalar multiplication. Documentation is updated accordingly.
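To illustrate the relaxed behavior, here is a minimal toy sketch. It is not the curve25519-dalek API: the struct, field, and method names are stand-ins, and integers play the role of curve points so the example is self-contained. The point is the assertion: instead of requiring the scalar count to equal the precomputed-point count, it only requires it to be less than or equal, and any extra precomputed entries are ignored.

```rust
// Toy model of partial precomputation (NOT the dalek API; names and types
// are hypothetical stand-ins, with i64 substituting for curve points).
struct Precomputation {
    // Stand-in for the precomputed lookup tables over static points.
    static_points: Vec<i64>,
}

impl Precomputation {
    fn new(points: Vec<i64>) -> Self {
        Self { static_points: points }
    }

    // Before the change: panics unless static_scalars.len() == static_points.len().
    // After: only requires static_scalars.len() <= static_points.len(); the
    // unused precomputed points are simply skipped.
    fn vartime_multiscalar_mul(&self, static_scalars: &[i64]) -> i64 {
        assert!(static_scalars.len() <= self.static_points.len());
        static_scalars
            .iter()
            .zip(self.static_points.iter())
            .map(|(s, p)| s * p)
            .sum()
    }
}

fn main() {
    // Precompute over four "generators", then evaluate with only two scalars,
    // as a short proof's verifier would.
    let precomp = Precomputation::new(vec![2, 3, 5, 7]);
    let result = precomp.vartime_multiscalar_mul(&[10, 20]);
    assert_eq!(result, 10 * 2 + 20 * 3); // only the first two points are used
    println!("{result}");
}
```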

Comments welcome!

@elichai
Contributor

elichai commented Jul 27, 2023

> Adding Clone support makes it possible to do things like Arc wrapping, where derived cloning (which doesn't actually clone the wrapped type) requires Clone support all the way down. While you'd almost certainly never want to actually clone a set of tables, this seemed like a much easier way to support reuse without a major refactor.

Arc doesn't require that T: Clone. See the implementation and an example.

@AaronFeickert
Contributor Author

AaronFeickert commented Jul 29, 2023

The specific context I was considering involved the use of a generic Arc whose trait bound includes VartimePrecomputedMultiscalarMul, and which does require underlying Clone support or a manual Clone implementation for a containing struct.

@elichai
Contributor

elichai commented Jul 30, 2023

> The specific context I was considering involved the use of a generic Arc whose trait bound includes VartimePrecomputedMultiscalarMul, and which does require underlying Clone support or a manual Clone implementation for a containing struct.

Could you give a code example for this pattern?

@AaronFeickert
Contributor Author

Apologies that it's not a MWE, but this pull request helps to demonstrate the issue. A derived clone of a struct containing an Arc with a VartimePrecomputedMultiscalarMul trait bound apparently must also enforce Clone on the underlying precomputation struct.

The alternate solution shown in that PR is to do manual Arc cloning, which works as expected. Apparently it's the generic with the trait bound that trips up the derived cloning. It seemed more straightforward to me to support Clone at the curve library level, but I can understand a desire to avoid cloning due to table sizes.
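The interaction being described can be reduced to a small standalone sketch. The trait and struct names below are hypothetical stand-ins (not the dalek types): `#[derive(Clone)]` on a struct generic over `P` adds a `P: Clone` bound, even when the only field is an `Arc<P>`, whose own `Clone` impl never requires `P: Clone`. A manual `Clone` impl avoids the bound.

```rust
use std::sync::Arc;

// Stand-in for curve25519_dalek::traits::VartimePrecomputedMultiscalarMul.
trait VartimePrecomputedMultiscalarMul {}

// A precomputation type that deliberately does NOT implement Clone.
struct Tables;
impl VartimePrecomputedMultiscalarMul for Tables {}

// #[derive(Clone)] here would require `P: Clone`, because derive bounds
// every generic parameter, even though cloning an Arc never clones the
// wrapped value:
//
//     #[derive(Clone)]  // error without P: Clone
//     struct Verifier<P: VartimePrecomputedMultiscalarMul> { precomp: Arc<P> }
//
// The manual-clone workaround mentioned above only relies on Arc<P>: Clone,
// which holds unconditionally.
struct Verifier<P: VartimePrecomputedMultiscalarMul> {
    precomp: Arc<P>,
}

impl<P: VartimePrecomputedMultiscalarMul> Clone for Verifier<P> {
    fn clone(&self) -> Self {
        Self { precomp: Arc::clone(&self.precomp) }
    }
}

fn main() {
    let v = Verifier { precomp: Arc::new(Tables) };
    let w = v.clone(); // fine: only the Arc's refcount changes
    assert_eq!(Arc::strong_count(&w.precomp), 2);
}
```

The derive's conservative bound is a long-standing Rust limitation, which is why either the manual `Clone` impl above or a `Clone` impl on the precomputation type itself is needed.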

Anyway, happy to remove Clone if it's considered a blocker here.

@AaronFeickert
Contributor Author

Is cloning holding this up? Happy to remove it if so.

@tarcieri
Contributor

@AaronFeickert there's a large backlog of PRs; it's going to take a while to work through.

@AaronFeickert
Contributor Author

@tarcieri no problem at all! Just wanted to check if the earlier discussion was blocking anything with this.

@AaronFeickert
Contributor Author

Removed cloning and rebased.

@AaronFeickert
Contributor Author

Any updates on this? Would appreciate any feedback from @tarcieri et al. as time permits.

@AaronFeickert
Contributor Author

This has been rebased to be up to date against recent changes.

Would greatly appreciate a brief review from @tarcieri or another maintainer on whether this change seems reasonable to merge.

@AaronFeickert AaronFeickert closed this by deleting the head repository Jul 1, 2024